Newton-based optimization for Kullback-Leibler nonnegative tensor factorizations

Authors

  • Samantha Hansen
  • Todd Plantenga
  • Tamara G. Kolda
Abstract

Tensor factorizations with nonnegative constraints have found application in analyzing data from cyber traffic, social networks, and other areas. We consider application data best described as being generated by a Poisson process (e.g., count data), which leads to sparse tensors that can be modeled by sparse factor matrices. In this paper we investigate efficient techniques for computing an appropriate canonical polyadic tensor factorization based on the Kullback-Leibler divergence function. We propose novel subproblem solvers within the standard alternating block variable approach. Our new methods exploit structure and reformulate the optimization problem as small independent subproblems. We employ bound-constrained Newton and quasi-Newton methods. We compare our algorithms against other codes, demonstrating superior speed for high accuracy results and the ability to quickly find sparse solutions.
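The full text sits behind the download link, but the objective the abstract describes is standard: minimize the generalized Kullback-Leibler (Poisson negative log-likelihood) divergence between a nonnegative count tensor and a canonical polyadic (CP) model. The NumPy sketch below, with hypothetical function and variable names, shows that objective for a 3-way tensor together with one classical multiplicative update on a factor matrix as a baseline; the paper's contribution replaces such updates with bound-constrained Newton/quasi-Newton solves on row-wise independent subproblems, which is not reproduced here.

```python
import numpy as np

def cp_model(A, B, C):
    """Dense 3-way CP reconstruction M[i,j,k] = sum_r A[i,r] B[j,r] C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def kl_divergence(X, M, eps=1e-12):
    """Generalized KL (Poisson) divergence D(X || M), dropping the terms
    that depend only on X: sum(M - X * log M)."""
    return float(np.sum(M - X * np.log(M + eps)))

def multiplicative_update_A(X, A, B, C, eps=1e-12):
    """One Lee-Seung-style KL update for factor A (baseline only; the
    paper replaces this with per-row Newton subproblem solves)."""
    M = cp_model(A, B, C)
    KR = np.einsum('jr,kr->jkr', B, C).reshape(-1, A.shape[1])  # Khatri-Rao
    X1 = X.reshape(X.shape[0], -1)   # mode-1 unfolding of the data
    M1 = M.reshape(X.shape[0], -1)   # mode-1 unfolding of the model
    numer = (X1 / (M1 + eps)) @ KR
    denom = np.ones_like(X1) @ KR + eps
    return A * numer / denom

# Toy usage on a random sparse-ish count tensor.
rng = np.random.default_rng(0)
X = rng.poisson(0.3, size=(10, 8, 6)).astype(float)
A, B, C = (rng.random((n, 4)) for n in X.shape)
for _ in range(50):
    A = multiplicative_update_A(X, A, B, C)  # alternate over B, C similarly
print(kl_divergence(X, cp_model(A, B, C)))
```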


Similar resources


Tackling Box-Constrained Optimization via a New Projected Quasi-Newton Approach

Numerous scientific applications across a variety of fields depend on box-constrained convex optimization. Box-constrained problems therefore continue to attract research interest. We address box-constrained (strictly convex) problems by deriving two new quasi-Newton algorithms. Our algorithms are positioned between the projected-gradient [J. B. Rosen, J. SIAM, 8(1), 1960, pp. 181–217], and pro...
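As a rough illustration of the problem class this entry addresses, the sketch below implements plain projected gradient descent with a Euclidean box projection; the projected quasi-Newton methods the abstract describes additionally scale the step with curvature information, which is omitted here. All names are illustrative, not taken from the paper.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box {x : lo <= x <= hi}."""
    return np.clip(x, lo, hi)

def projected_gradient(grad, x0, lo, hi, step=0.1, iters=200):
    """Plain projected-gradient method for box-constrained problems.
    Projected (quasi-)Newton variants replace `step * grad(x)` with a
    curvature-scaled direction restricted to the free variables."""
    x = project_box(x0, lo, hi)
    for _ in range(iters):
        x = project_box(x - step * grad(x), lo, hi)
    return x

# Minimize f(x) = 0.5 * ||x - c||^2 over the box [0, 1]^3.
c = np.array([1.5, -0.3, 0.7])
x = projected_gradient(lambda x: x - c, np.zeros(3), 0.0, 1.0)
print(x)  # -> [1.0, 0.0, 0.7], the projection of c onto the box
```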


Kullback-Leibler Divergence for Nonnegative Matrix Factorization

The I-divergence or unnormalized generalization of Kullback-Leibler (KL) divergence is commonly used in Nonnegative Matrix Factorization (NMF). This divergence has the drawback that its gradients with respect to the factorizing matrices depend heavily on the scales of the matrices, and learning the scales in gradient-descent optimization may require many iterations. This is often handled by expl...
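The scale issue this entry raises is easy to demonstrate numerically: rescaling (W, H) to (cW, H/c) leaves the product WH, and hence the divergence, unchanged, yet the gradient with respect to W shrinks by a factor of c. A minimal NumPy sketch, with hypothetical function names:

```python
import numpy as np

def i_divergence(X, W, H, eps=1e-12):
    """Unnormalized (generalized) KL, a.k.a. I-divergence:
    D(X || WH) = sum(X * log(X / WH) - X + WH)."""
    M = W @ H
    return float(np.sum(X * np.log((X + eps) / (M + eps)) - X + M))

def grad_W(X, W, H, eps=1e-12):
    """Gradient of the I-divergence with respect to W."""
    M = W @ H
    return (1.0 - X / (M + eps)) @ H.T

rng = np.random.default_rng(1)
X = rng.random((6, 5)); W = rng.random((6, 3)); H = rng.random((3, 5))

# (W, H) -> (c*W, H/c) leaves WH and the divergence unchanged, but the
# gradient w.r.t. W shrinks by 1/c, so gradient-descent step sizes must
# track the scale -- the drawback this entry discusses.
c = 10.0
print(i_divergence(X, W, H), i_divergence(X, c * W, H / c))  # equal
print(np.linalg.norm(grad_W(X, W, H)),
      np.linalg.norm(grad_W(X, c * W, H / c)))               # ratio ~ c
```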


Kullback-Leibler Principal Component for Tensors is not NP-hard

We study the problem of nonnegative rank-one approximation of a nonnegative tensor, and show that the globally optimal solution that minimizes the generalized Kullback-Leibler divergence can be efficiently obtained, i.e., it is not NP-hard. This result works for arbitrary nonnegative tensors with an arbitrary number of modes (including two, i.e., matrices). We derive a closed-form expression fo...
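The closed-form expression itself is cut off in this excerpt. The known result for this problem is that the optimal rank-one factor in each mode is proportional to that mode's marginal sums, scaled so the model matches the total mass of the tensor; the sketch below constructs that solution under this assumption (consult the full paper for the precise statement and proof).

```python
import numpy as np

def kl_rank_one(X):
    """Nonnegative rank-one approximation of X under generalized KL,
    assuming the marginal-based closed form: each mode's factor is the
    normalized marginal sum, scaled by the tensor's total mass."""
    total = X.sum()
    factors = []
    for mode in range(X.ndim):
        axes = tuple(ax for ax in range(X.ndim) if ax != mode)
        factors.append(X.sum(axis=axes) / total)
    # Outer product of the normalized marginals, times the total mass.
    return total * np.einsum('i,j,k->ijk', *factors)  # 3-way case for brevity

rng = np.random.default_rng(2)
X = rng.poisson(1.0, size=(4, 3, 5)).astype(float)
Y = kl_rank_one(X)
print(X.sum(), Y.sum())  # the optimum preserves the total mass
```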


Weighted Nonnegative Matrix Factorization and Face Feature Extraction

In this paper we consider weighted nonnegative matrix factorizations and we show that the popular algorithms of Lee and Seung can incorporate such a weighting. We then prove that for appropriately chosen weighting matrices, the weighted Euclidean distance function and the weighted generalized Kullback-Leibler divergence function are essentially identical. We finally show that the weighting can ...
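For context, the weighted Euclidean updates this entry refers to are the standard weighted extension of the Lee-Seung multiplicative rules. The sketch below shows one plausible form of that extension with an elementwise weight matrix; it is a generic illustration, not code from the paper.

```python
import numpy as np

def weighted_nmf_step(X, A, W, H, eps=1e-12):
    """One multiplicative update pair for weighted Euclidean NMF,
    min ||sqrt(A) * (X - WH)||_F^2 with elementwise weights A >= 0.
    Standard weighted extension of the Lee-Seung updates (a sketch;
    see the paper for the exact formulation and proofs)."""
    WH = W @ H
    H = H * (W.T @ (A * X)) / (W.T @ (A * WH) + eps)
    WH = W @ H
    W = W * ((A * X) @ H.T) / ((A * WH) @ H.T + eps)
    return W, H

rng = np.random.default_rng(3)
X = rng.random((8, 6)); A = rng.random((8, 6))   # data and weights
W = rng.random((8, 2)); H = rng.random((2, 6))
for _ in range(100):
    W, H = weighted_nmf_step(X, A, W, H)
print(np.linalg.norm(np.sqrt(A) * (X - W @ H)))  # weighted residual shrinks
```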




Journal:
  • Optimization Methods and Software

Volume: 30  Issue: -

Pages: -

Publication year: 2015